Notes on the Cramér-Rao Inequality
Abstract
Suppose X is a random variable with pdf f_X(x; θ), where θ is an unknown parameter. Let X₁, . . . , Xₙ be a random sample and θ̂ = θ̂(X₁, . . . , Xₙ) an estimator. We have seen that E(θ̂), or rather E(θ̂) − θ, measures how biased θ̂ is. We have also seen that Var(θ̂) provides a measure of efficiency: the smaller the variance of θ̂, the more likely θ̂ is to provide an accurate estimate of θ. Given a specific unbiased estimator θ̂, how do we know whether it is the best one (most efficient, i.e., smallest variance), or whether a better one exists? A key tool for understanding this question is a theoretical lower bound on how small Var(θ̂) can be. This is the Cramér-Rao Inequality. From now on, we assume X is continuous and θ is a single real parameter (i.e., there is only one unknown). We will also assume the range of X does not depend on θ. To be more precise, we will assume there exist a, b ∈ ℝ ∪ {±∞}, independent of θ, such that

f_X(x; θ) > 0 if a < x < b,
f_X(x; θ) = 0 if x < a or x > b.
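Under these regularity assumptions, the Cramér-Rao inequality states that any unbiased estimator satisfies Var(θ̂) ≥ 1/(n I(θ)), where I(θ) = E[(∂ log f_X(X; θ)/∂θ)²] is the Fisher information of a single observation. As an illustrative sketch (not part of the original notes), the simulation below checks the bound for the standard textbook case X ~ N(θ, σ²) with σ known, where I(θ) = 1/σ² and the sample mean is known to attain the bound:

```python
import numpy as np

# Monte Carlo check of the Cramér-Rao lower bound (CRLB) for the normal
# model N(theta, sigma^2) with sigma known.  The Fisher information of a
# single observation is I(theta) = 1/sigma^2, so the CRLB for an unbiased
# estimator based on n observations is sigma^2 / n; the sample mean is an
# unbiased estimator that attains this bound exactly.

rng = np.random.default_rng(0)
theta, sigma, n, reps = 2.0, 3.0, 50, 20_000

crlb = sigma**2 / n  # theoretical lower bound 1/(n * I(theta))

# Empirical variance of theta_hat = sample mean, over many replications.
samples = rng.normal(theta, sigma, size=(reps, n))
theta_hat = samples.mean(axis=1)
var_hat = theta_hat.var()

print(f"CRLB      = {crlb:.4f}")
print(f"Var(mean) = {var_hat:.4f}")  # should closely match the CRLB
```

For an estimator that does not attain the bound (e.g., the sample median in this model), the same experiment would show an empirical variance strictly above `crlb`.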
Similar articles
Generalization of Cramér-Rao and Bhattacharyya inequalities for the weighted covariance matrix
The paper considers a family of probability distributions depending on a parameter. The goal is to derive the generalized versions of Cramér-Rao and Bhattacharyya inequalities for the weighted covariance matrix and of the Kullback inequality for the weighted Kullback distance, which are important objects themselves [9, 23, 28]. The asymptotic forms of these inequalities for a particular family ...
Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam’s inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
Improved Cramer-Rao Inequality for Randomly Censored Data
As an application of the improved Cauchy-Schwarz inequality due to Walker (Statist. Probab. Lett. (2017) 122:86-90), we obtain an improved version of the Cramer-Rao inequality for randomly censored data derived by Abdushukurov and Kim (J. Soviet. Math. (1987) pp. 2171-2185). We derive a lower bound of Bhattacharya type for the mean square error of a parametric function based on randomly censor...
When the Cramér-Rao Inequality Provides No Information
We investigate a one-parameter family of probability densities (related to the Pareto distribution, which describes many natural phenomena) where the Cramér-Rao inequality provides no information.

1. Cramér-Rao Inequality. One of the most important problems in statistics is estimating a population parameter from a finite sample. As there are often many different estimators, it is desirable to be...
On multidimensional generalized Cramér-Rao inequalities, uncertainty relations and characterizations of generalized q-Gaussian distributions
In the present work, we show how the generalized Cramér-Rao inequality for the estimation of a parameter, presented in a recent paper, can be extended to the multidimensional case with general norms on Rⁿ, and to a wider context. As a particular case, we obtain a new multidimensional Cramér-Rao inequality which is saturated by generalized q-Gaussian distributions. We also give another related Cr...